
Gatsby Computational Neuroscience Unit





Joan Bruna

 

Monday 21st March 2016

Time: 11.00am

Ground Floor Seminar Room
25 Howland Street, London, W1T 4JG

Convolutional Neural Networks against the curse of dimensionality.

 

 

Convolutional Neural Networks (CNNs) are a powerful class of non-linear representations that have demonstrated, across numerous supervised learning tasks, an ability to extract rich information from images, speech and text with excellent statistical generalization. These are examples of truly high-dimensional signals, for which classical statistical models suffer from the so-called curse of dimensionality: they fail to generalize well unless provided with exponentially large amounts of training data.
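To make the "exponentially large amounts of training data" concrete, here is a small illustrative sketch (an assumed example, not part of the talk): the number of cells of side eps needed to cover the unit cube [0,1]^d, a standard proxy for how many samples a purely local estimator needs, grows as (1/eps)^d.

```python
# Toy illustration of the curse of dimensionality (assumed example):
# covering [0, 1]^d at resolution eps requires (1/eps)^d cells, so a purely
# local estimator needs exponentially many samples as d grows.

def covering_number(d, eps=0.1):
    """Number of axis-aligned cells of side eps needed to tile [0, 1]^d."""
    return (1.0 / eps) ** d

for d in (1, 2, 16, 256):
    print(f"d = {d:3d}: about {covering_number(d):.2e} cells at resolution 0.1")
```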

In order to gain insight into the reasons for this success, in this talk we will start by studying statistical models defined from wavelet scattering networks, a class of CNNs whose convolutional filter banks are given by complex, multi-resolution wavelet families. As a result of this extra structure, they are provably stable and locally invariant signal representations, and they yield state-of-the-art classification results on several pattern and texture recognition problems. The reasons for this success lie in their ability to preserve discriminative information while remaining stable with respect to high-dimensional deformations, providing a framework that partially extends to trained CNNs.
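As a rough illustration of the ingredients mentioned above (a wavelet filter bank, a modulus non-linearity, and local averaging), the following Python/NumPy sketch computes first-order scattering-style coefficients of a 1-D signal. The filter shapes and parameters are assumptions chosen for readability, not the construction used in the talk.

```python
# Minimal 1-D scattering sketch (a toy, with assumed filter parameters):
# filter the signal with a bank of complex Morlet-like wavelets, take the
# complex modulus, then average with a low-pass window. The modulus followed
# by averaging is what gives local translation invariance and stability.
import numpy as np

def morlet(n, xi, sigma):
    """Complex Morlet-style wavelet of length n with centre frequency xi."""
    t = np.arange(n) - n // 2
    return np.exp(1j * xi * t) * np.exp(-t**2 / (2 * sigma**2))

def scattering_layer(x, wavelets, phi):
    """First-order coefficients: local average of |x * psi| for each wavelet psi."""
    coeffs = []
    for psi in wavelets:
        u = np.abs(np.convolve(x, psi, mode="same"))   # modulus non-linearity
        s = np.convolve(u, phi, mode="same")           # local averaging (low-pass)
        coeffs.append(s)
    return np.stack(coeffs)

# toy usage: a random signal, 4 wavelets at dyadic frequencies, Gaussian low-pass
x = np.random.randn(1024)
bank = [morlet(64, xi=np.pi / 2**j, sigma=2**j) for j in range(4)]
phi = np.exp(-np.linspace(-3, 3, 64)**2)
phi /= phi.sum()
S1 = scattering_layer(x, bank, phi)
print(S1.shape)   # (4, 1024): one averaged envelope per wavelet scale
```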

We will give conditions under which signals can be recovered from their scattering coefficients, and will introduce a family of Gibbs processes defined by a collection of scattering CNN sufficient statistics, from which one can sample image and auditory textures. Although the scattering recovery is non-convex and corresponds to a generalized phase recovery problem, gradient descent algorithms show good empirical performance and enjoy weak convergence properties. We will discuss connections with non-linear compressed sensing, applications to texture synthesis and inverse problems such as super-resolution, as well as generalizations to unsupervised learning using deep convolutional sufficient statistics.
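As a hypothetical, much-simplified instance of the recovery problem described above, the sketch below runs plain gradient descent on the non-convex objective (1/2) sum_k || |x * psi_k| - c_k ||^2, with random complex filters standing in for a scattering network. It only illustrates that descent reduces the modulus-matching error on a toy generalized phase recovery instance; it is not the speaker's algorithm and carries no guarantees.

```python
# Hypothetical sketch of modulus-based recovery (a toy generalized phase
# retrieval, not the method from the talk): observe c_k = |x0 * psi_k| for
# random complex filters psi_k, then run gradient descent on
#   f(x) = (1/2) sum_k || |x * psi_k| - c_k ||^2   (circular convolutions via FFT).
import numpy as np

rng = np.random.default_rng(0)
n, n_filters = 256, 8
x0 = rng.standard_normal(n)                       # ground-truth signal

# random complex filters in the Fourier domain, normalised to unit gain
F = []
for _ in range(n_filters):
    Fk = np.fft.fft(rng.standard_normal(n) + 1j * rng.standard_normal(n))
    F.append(Fk / np.abs(Fk).max())

def analysis(x):
    """Complex coefficients x * psi_k, computed as circular convolutions."""
    X = np.fft.fft(x)
    return [np.fft.ifft(X * Fk) for Fk in F]

target = [np.abs(z) for z in analysis(x0)]        # observed moduli c_k

def objective(x):
    return 0.5 * sum(np.sum((np.abs(z) - c) ** 2)
                     for z, c in zip(analysis(x), target))

x = rng.standard_normal(n)                        # random initialisation
step = 0.05
print("initial objective:", objective(x))
for _ in range(1000):
    grad = np.zeros(n)
    for z, c, Fk in zip(analysis(x), target, F):
        phase = z / (np.abs(z) + 1e-12)           # current phase estimate
        residual = (np.abs(z) - c) * phase        # re-phased modulus residual
        # adjoint of the circular convolution, restricted to real signals
        grad += np.real(np.fft.ifft(np.fft.fft(residual) * np.conj(Fk)))
    x -= step * grad
print("final objective:  ", objective(x))
```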

 

 

 

 

 
